Federated Learning Hyper-Parameter Tuning From A System Perspective

Authors

Abstract

Federated learning (FL) is a distributed model training paradigm that preserves clients’ data privacy. It has gained tremendous attention from both academia and industry. FL hyper-parameters (e.g., the number of selected clients and the number of local training passes) significantly affect training overhead in terms of computation time, transmission time, computation load, and transmission load. However, the current practice of manually selecting FL hyper-parameters imposes a heavy burden on practitioners because different applications have different training preferences. In this paper, we propose an automatic FL hyper-parameter tuning algorithm tailored to applications’ diverse system requirements. The algorithm iteratively adjusts FL hyper-parameters during training and can be easily integrated into existing FL systems. Through extensive evaluations across applications and FL aggregation algorithms, we show that it is lightweight and effective, achieving an 8.48%-26.75% reduction in system overhead compared with using fixed hyper-parameters. This paper assists practitioners in designing high-performance FL training solutions. The source code is publicly available.
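As a rough illustration of the kind of loop the abstract describes, the sketch below adjusts two FL hyper-parameters (clients per round and local training passes) based on a weighted system cost measured after each round. It is a minimal, self-contained sketch, not the paper’s algorithm: run_round, the preference weights alpha/beta, and the back-off rule are all hypothetical placeholders.

```python
import random

def run_round(num_clients, local_epochs):
    # Stub for one FL round; returns (computation_time, transmission_load).
    # In a real system these would be measured from the training infrastructure.
    comp = num_clients * local_epochs * random.uniform(0.9, 1.1)
    trans = num_clients * random.uniform(0.9, 1.1)
    return comp, trans

def tune(rounds=20, alpha=0.7, beta=0.3):
    # alpha/beta encode the application's preference between computation time
    # and transmission load (a hypothetical weighting, not the paper's formula).
    num_clients, local_epochs = 10, 2
    best_cost = float("inf")
    for _ in range(rounds):
        comp, trans = run_round(num_clients, local_epochs)
        cost = alpha * comp + beta * trans
        if cost < best_cost:
            best_cost = cost
        else:
            # Cost got worse: back off along the more expensive dimension.
            if alpha * comp > beta * trans and local_epochs > 1:
                local_epochs -= 1
            elif num_clients > 2:
                num_clients -= 1
        print(f"clients={num_clients} epochs={local_epochs} cost={cost:.1f}")
    return num_clients, local_epochs

if __name__ == "__main__":
    tune()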

Similar articles

Lazy Paired Hyper-Parameter Tuning

In virtually all machine learning applications, hyper-parameter tuning is required to maximize predictive accuracy. Such tuning is computationally expensive, and the cost is further exacerbated by the need for multiple evaluations (via cross-validation or bootstrap) at each configuration setting to guarantee statistically significant results. This paper presents a simple, general technique for i...
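For illustration of paired, repeated evaluation of hyper-parameter configurations, the sketch below scores two hypothetical SVM settings on the same cross-validation folds and applies a paired t-test; it is a generic example and does not reproduce the lazy-evaluation technique that paper proposes.

```python
import numpy as np
from scipy.stats import ttest_rel
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, random_state=0)
configs = [{"C": 0.1}, {"C": 10.0}]  # two candidate settings
scores = {0: [], 1: []}

# Evaluate both configurations on the same folds so the differences are paired.
for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    for i, cfg in enumerate(configs):
        model = SVC(**cfg).fit(X[train], y[train])
        scores[i].append(model.score(X[test], y[test]))

_, pval = ttest_rel(scores[0], scores[1])
print(f"mean acc: {np.mean(scores[0]):.3f} vs {np.mean(scores[1]):.3f}, p={pval:.3f}")
```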

Hyper-Parameter Tuning for Graph Kernels via Multiple Kernel Learning

Kernelized learning algorithms have seen a steady growth in popularity during the last decades. The procedure to estimate the performance of these kernels in real applications is typically computationally demanding due to the process of hyper-parameter selection. This is especially true for graph kernels, which are computationally quite expensive. In this paper, we study an approach that substit...
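A simplified stand-in for that approach is sketched below, assuming RBF kernels in place of graph kernels and a crude kernel-alignment heuristic in place of a real multiple-kernel-learning solver.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=1)
# RBF kernels at several bandwidths stand in for graph kernels
# computed with different hyper-parameters.
kernels = [rbf_kernel(X, gamma=g) for g in (0.01, 0.1, 1.0)]

# Weight each kernel by its alignment with the label kernel -- a crude
# heuristic standing in for a full MKL optimisation.
yy = np.outer(2 * y - 1, 2 * y - 1)
weights = np.array([np.sum(K * yy) / np.linalg.norm(K) for K in kernels])
weights = np.clip(weights, 0.0, None)
weights /= weights.sum()

K_combined = sum(w * K for w, K in zip(weights, kernels))
clf = SVC(kernel="precomputed").fit(K_combined, y)
print("kernel weights:", np.round(weights, 3),
      "train accuracy:", round(clf.score(K_combined, y), 3))
```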

A Head Parameter Survey on Mazandarani Dialect and Its Effect(s) on Learning English from a CA Perspective (on the Basis of X-bar Syntax)

There has been a gradual shift of focus from the study of rule systems, which have increasingly been regarded as impoverished, … to the study of systems of principles, which appear to occupy a much more central position in determining the character and variety of possible human languages. There is a set of absolute universals, notions and principles existing in UG which do not vary from one ...

Effects of Random Sampling on SVM Hyper-parameter Tuning

Hyper-parameter tuning is one of the crucial steps in the successful application of machine learning algorithms to real data. In general, the tuning process is modeled as an optimization problem for which several methods have been proposed. For complex algorithms, the evaluation of a hyper-parameter configuration is expensive, and its runtime is sped up through data sampling. In this paper, t...
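A minimal sketch of that sampling strategy is shown below, with illustrative data and sample sizes, and a standard grid search standing in for whatever tuning method the paper evaluates.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=5000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Run the expensive hyper-parameter search on a random subsample only.
idx = np.random.default_rng(0).choice(len(X_tr), size=1000, replace=False)
grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}
search = GridSearchCV(SVC(), grid, cv=3).fit(X_tr[idx], y_tr[idx])

# Retrain the selected configuration on the full training set.
final = SVC(**search.best_params_).fit(X_tr, y_tr)
print("best params:", search.best_params_,
      "test accuracy:", round(final.score(X_te, y_te), 3))
```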

Tuning Metaheuristics - A Machine Learning Perspective


Journal

Journal title: IEEE Internet of Things Journal

Year: 2023

ISSN: 2372-2541, 2327-4662

DOI: https://doi.org/10.1109/jiot.2023.3253813